Quantum relative entropy
In quantum information theory, quantum relative entropy is a measure of distinguishability between two quantum states. It is the quantum mechanical analog of relative entropy.
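As a concrete illustration, here is a minimal numerical sketch, assuming the standard definition ''S''(''ρ''‖''σ'') = Tr ''ρ''(log ''ρ'' − log ''σ''), which is well defined for full-rank states; the function and the two qubit states below are hypothetical examples, not part of the formal development.

<syntaxhighlight lang="python">
import numpy as np
from scipy.linalg import logm

def quantum_relative_entropy(rho, sigma):
    # S(rho || sigma) = Tr[rho (log rho - log sigma)], natural logarithm.
    # Assumes full-rank density matrices so that the matrix logarithm exists.
    return float(np.real(np.trace(rho @ (logm(rho) - logm(sigma)))))

# Illustrative qubit states: a correlated mixed state vs. the maximally mixed state.
rho = np.array([[0.75, 0.25],
                [0.25, 0.25]])
sigma = np.eye(2) / 2

print(quantum_relative_entropy(rho, sigma))  # ~0.277 > 0: the states are distinguishable
print(quantum_relative_entropy(rho, rho))    # 0 (up to rounding): a state is indistinguishable from itself
</syntaxhighlight>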
== Motivation ==

For simplicity, it will be assumed that all objects in the article are finite-dimensional.
We first discuss the classical case. Suppose the probabilities of a finite sequence of events are given by the probability distribution ''P'' = {''p''<sub>1</sub>, ..., ''p''<sub>''n''</sub>}, but somehow we mistakenly assumed it to be ''Q'' = {''q''<sub>1</sub>, ..., ''q''<sub>''n''</sub>}. For instance, we can mistake an unfair coin for a fair one. According to this erroneous assumption, our uncertainty about the ''j''-th event, or equivalently, the amount of information provided after observing the ''j''-th event, is
:<math>-\log q_j .</math>
The (assumed) average uncertainty of all possible events is then
:<math>-\sum_j p_j \log q_j .</math>
On the other hand, the Shannon entropy of the probability distribution ''P'', defined by
:<math>-\sum_j p_j \log p_j ,</math>
is the real amount of uncertainty before observation. Therefore, the difference between these two quantities,
:<math>-\sum_j p_j \log q_j - \left(-\sum_j p_j \log p_j\right) = \sum_j p_j \log p_j - \sum_j p_j \log q_j ,</math>
is a measure of the distinguishability of the two probability distributions ''P'' and ''Q''. This is precisely the classical relative entropy, or Kullback–Leibler divergence:
:<math>D_{\mathrm{KL}}(P\|Q) = \sum_j p_j \log \frac{p_j}{q_j} .</math>
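To make the derivation concrete, here is a minimal numerical sketch computing the assumed average uncertainty (the cross entropy), the Shannon entropy, and their difference for the coin example above; the particular bias ''q'' = (0.7, 0.3) is an assumed illustration.

<syntaxhighlight lang="python">
import numpy as np

def cross_entropy(p, q):
    # Assumed average uncertainty: -sum_j p_j log q_j (natural logarithm).
    return -np.sum(p * np.log(q))

def shannon_entropy(p):
    # Real amount of uncertainty: -sum_j p_j log p_j.
    return -np.sum(p * np.log(p))

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_j p_j log(p_j / q_j).
    # Assumes strictly positive probabilities; see the 0*log 0 convention below.
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.5])  # the true distribution: a fair coin
q = np.array([0.7, 0.3])  # the mistaken assumption: a biased coin

# The difference of the two entropies equals the relative entropy.
print(cross_entropy(p, q) - shannon_entropy(p))  # ~0.0872
print(kl_divergence(p, q))                       # ~0.0872
</syntaxhighlight>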
Note
#In the definitions above, the convention 0·log 0 = 0 is assumed, since <math>\lim_{x \to 0} x \log x = 0</math>. Intuitively, one would expect an event of zero probability to contribute nothing to the entropy.
#The relative entropy is not a metric. For example, it is not symmetric: the uncertainty discrepancy in mistaking a fair coin for an unfair one is not the same as in the opposite situation, as the sketch below illustrates.
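A minimal sketch of this asymmetry, reusing the hypothetical coin distributions from the sketch above:

<syntaxhighlight lang="python">
import numpy as np

def kl_divergence(p, q):
    # D_KL(P || Q) = sum_j p_j log(p_j / q_j), for strictly positive probabilities.
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.5])  # fair coin
q = np.array([0.7, 0.3])  # biased coin (an assumed illustration)

# Mistaking the fair coin for the biased one...
print(kl_divergence(p, q))  # ~0.0872
# ...is not the same as mistaking the biased coin for the fair one.
print(kl_divergence(q, p))  # ~0.0823
</syntaxhighlight>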
